The "weight smoothing" regularization of MLP for Jacobian stabilization

Authors

  • Filipe Aires
  • Michel Schmitt
  • Alain Chedin
  • Noelle Scott
Abstract

In an approximation problem with a neural network, a low output root-mean-square (rms) error is not always a sufficient quality criterion. In this paper, we investigate problems where the Jacobians (first derivatives of an output value with respect to an input value) of the approximation model are needed, and we propose to add a quality criterion on these Jacobians during the learning step. More specifically, we focus here on the approximation of functionals A from a space of continuous functions (discretized in practice) to a scalar space. In this case, the approximation is confronted with the compensation phenomenon: a lower contribution of one input can be compensated by a larger one from its neighboring inputs. The profiles (with respect to the input index) of the neural Jacobians then become very irregular instead of smooth, and the approximation of A becomes an ill-posed problem because many solutions can be chosen by the learning process. We propose to introduce the smoothness of Jacobian profiles as a priori information via a regularization technique and develop a new and efficient learning algorithm, called "weight smoothing." We assess the robustness of the weight-smoothing algorithm by testing it on a real and complex problem stemming from meteorology: the neural approximation of the forward model of the radiative transfer equation in the atmosphere. The stabilized Jacobians of this model are then used in an inversion process to illustrate the improvement of the Jacobians after weight smoothing.
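To make the idea concrete, the sketch below shows one plausible form of such a regularizer: a discrete second-difference (roughness) penalty on each hidden unit's first-layer weight profile along the input index, added to the usual mse loss. This is an illustrative reconstruction, not the paper's exact operator; the function names and the hyperparameter `lam` are our own.

```python
import numpy as np

def weight_smoothing_penalty(W):
    """Roughness penalty on first-layer weights along the input index.

    W: (n_hidden, n_inputs) weight matrix. For each hidden unit, the
    squared discrete second difference of its weight profile is summed,
    penalizing jagged profiles and so encouraging smooth Jacobian
    profiles with respect to the input index.
    (Illustrative form; the exact regularizer in the paper may differ.)
    """
    d2 = W[:, 2:] - 2.0 * W[:, 1:-1] + W[:, :-2]  # second difference per unit
    return np.sum(d2 ** 2)

def regularized_loss(y_pred, y_true, W, lam=1e-3):
    """Mean squared error plus the weight-smoothing term, weighted by lam."""
    mse = np.mean((y_pred - y_true) ** 2)
    return mse + lam * weight_smoothing_penalty(W)
```

A perfectly linear weight profile (constant slope across the input index) incurs zero penalty, while an oscillating profile, typical of the compensation phenomenon, is penalized, which is exactly the behavior the abstract describes.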


Similar references

Adding noise to the input of a model trained with a regularized objective

Regularization is a well-studied problem in the context of neural networks. It is usually used to improve the generalization performance when the number of input samples is relatively small or heavily contaminated with noise. The regularization of a parametric model can be achieved in different manners, some of which are early stopping (Morgan and Bourlard, 1990), weight decay, output smoothing ...

A Combined Smoothing and Regularization Method for Monotone Second-Order Cone Complementarity Problems

Abstract. The Second-Order Cone Complementarity Problem (SOCCP) is a wide class of problems containing the Nonlinear Complementarity Problem (NCP) and the Second-Order Cone Programming Problem (SOCP). Recently, Fukushima, Luo and Tseng extended some merit functions and their smoothing functions for NCP to SOCCP. Moreover, they derived computable formulas for the Jacobians of the smoothing funct...

MLP-ARD vs. Logistic Regression and C4.5 for PIP Claim Fraud Explication

In this paper we demonstrate the explicative capabilities of multilayer perceptron neural networks (MLP) with automatic relevance determination (ARD) weight regularization for personal injury protection (PIP) automobile insurance claim fraud detection. The ARD objective function hyperparameter scheme provides a means for soft input selection, as it allows one to determine which predictor variables a...

Image Smoothing and Segmentation by Graph Regularization

We propose a discrete regularization framework on weighted graphs of arbitrary topology, which leads to a family of nonlinear filters, such as the bilateral filter or the TV digital filter. This framework, which minimizes a loss function plus a regularization term, is parameterized by a weight function defined as a similarity measure. It is applicable to several problems in image processing, da...

On the Coerciveness of Merit Functions for the Second-Order Cone Complementarity Problem

The Second-Order Cone Complementarity Problem (SOCCP) is a wide class of problems, which includes the Nonlinear Complementarity Problem (NCP) and the Second-Order Cone Programming Problem (SOCP). Recently, Fukushima, Luo and Tseng extended some merit functions and their smoothing functions for NCP to SOCCP. Moreover, they derived computable formulas for the Jacobians of the smoothing functions ...


Journal:
  • IEEE Transactions on Neural Networks

Volume 10, Issue 6

Pages: -

Publication year: 1999